Review for NeurIPS paper: Normalizing Kalman Filters for Multivariate Time Series Analysis
Additional Feedback: I very much enjoyed the paper, and I congratulate the authors on their work. The experimental results by themselves do not necessarily provide a compelling reason to use NKF over previous models. But I think the idea is an important and straightforward one, addressing a variety of weaknesses of previous models. On the other hand, the omission of any discussion of non-additive noise seemed problematic to me. Minor comments: [L262]: [5] is a textbook, please be more specific in the reference.
ComputeGPT: A computational chat model for numerical problems
Lewis, Ryan Hardesty, Jiao, Junfeng
Language models have made significant strides in recent years, becoming proficient at understanding and generating human-like text [26, 2]. However, despite these advances, traditional language models remain inaccurate in solving numerical problems, as their architecture relies on predicting the next word based on probability rather than executing calculations [3]. This paper introduces ComputeGPT, an innovative chat model capable of addressing computational problems by running on-demand code. ComputeGPT parses each question into relevant code, executes the code, and returns the computed answer as part of the chat. We combine this approach with a local browser-based Python interpreter, Pyodide, and fine-tuned prompts to achieve state-of-the-art efficiency in solving numerical problems while providing a suitable and safe environment for code execution.
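The parse-execute-return loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the question-to-code translation step is stubbed with a hard-coded string (in ComputeGPT it would be produced by the model and run sandboxed in Pyodide), and `execute_generated_code` is a hypothetical helper name.

```python
# Minimal sketch of the parse -> execute -> answer loop.
# Only the execution/return path is illustrated; the translation
# step (question -> code) is stubbed out below.

def execute_generated_code(code: str) -> str:
    """Run model-generated Python and return the value bound to `answer`."""
    namespace: dict = {}
    exec(code, namespace)  # ComputeGPT runs this sandboxed in Pyodide
    return str(namespace["answer"])

# Hypothetical output of the question-to-code step for
# "What is the derivative of x**3 at x = 2?"
generated = (
    "h = 1e-6\n"
    "f = lambda x: x**3\n"
    "answer = round((f(2 + h) - f(2 - h)) / (2 * h), 4)\n"
)
print(execute_generated_code(generated))  # -> 12.0, computed rather than predicted
```

The key point of the design is that the numeric answer is produced by actual execution, so it does not depend on the model having memorized the arithmetic.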
BETULA: Numerically Stable CF-Trees for BIRCH Clustering
Lang, Andreas, Schubert, Erich
BIRCH clustering is a widely known clustering approach that has influenced much subsequent research and many commercial products. The key contribution of BIRCH is the Clustering Feature tree (CF-Tree), a compressed representation of the input data. As new data arrives, the tree is eventually rebuilt to increase the compression. Afterward, the leaves of the tree are used for clustering. Because of the data compression, this method is very scalable. The idea has been adopted, for example, for k-means, data stream, and density-based clustering. The clustering features used by BIRCH are simple summary statistics that can easily be updated with new data: the number of points, the linear sums, and the sum of squared values. Unfortunately, the way the sum of squares is then used in BIRCH is prone to catastrophic cancellation. We introduce a replacement cluster feature that does not have this numerical problem, that is not much more expensive to maintain, and that makes many computations simpler and hence more efficient. These cluster features can also easily be used in other work derived from BIRCH, such as algorithms for streaming data. In the experiments, we demonstrate the numerical problem and compare the performance of the original algorithm to that of the improved cluster features.
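The catastrophic cancellation mentioned in the abstract is easy to reproduce. A sketch, under the assumption that BIRCH-style features compute variance as SS/n − (LS/n)², while the replacement stores mean and sum of squared deviations updated incrementally (Welford-style); function names and the test data are illustrative, not from the paper:

```python
# BIRCH-style clustering feature: variance from (n, linear sum, sum of squares).
def birch_variance(xs):
    n = len(xs)
    ls = sum(xs)                      # linear sum
    ss = sum(x * x for x in xs)       # sum of squares
    return ss / n - (ls / n) ** 2     # subtracts two huge, nearly equal numbers

# Replacement feature: (n, mean, sum of squared deviations), updated
# incrementally in the style of Welford's algorithm; the subtraction of
# near-equal large numbers is avoided.
def stable_variance(xs):
    n, mean, sse = 0, 0.0, 0.0
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n
        sse += delta * (x - mean)
    return sse / n

data = [1e8 + v for v in (0.0, 1.0, 2.0)]  # large offset, tiny spread
print(birch_variance(data))    # badly wrong: the true variance is 2/3
print(stable_variance(data))   # close to 2/3
```

With the offset of 10^8, the two terms in the BIRCH formula agree in roughly their first sixteen digits, so double precision leaves essentially no correct digits in the difference, while the incremental form keeps full accuracy.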
Rejoinder for "Probabilistic Integration: A Role in Statistical Computation?"
Briol, Francois-Xavier, Oates, Chris J., Girolami, Mark, Osborne, Michael A., Sejdinovic, Dino
This article is the rejoinder for the paper "Probabilistic Integration: A Role in Statistical Computation?" to appear in Statistical Science with discussion [Briol et al., 2015]. We would first like to thank the reviewers and many of our colleagues who helped shape this paper, the editor for selecting our paper for discussion, and of course all of the discussants for their thoughtful, insightful and constructive comments. In this rejoinder, we respond to some of the points raised by the discussants and comment further on the fundamental questions underlying the paper: - Should Bayesian ideas be used in numerical analysis? Numerical analysis is concerned with the approximation of typically high- or infinite-dimensional mathematical quantities using discretisations of the space on which these are defined. Different discretisation schemes lead to different numerical algorithms, whose stability and convergence properties need to be carefully assessed.
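To make the question concrete: the paper's central example is Bayesian quadrature, where a Gaussian process prior is placed on the integrand and the discretisation (the evaluation nodes) induces a posterior over the integral. The sketch below is an illustrative one-dimensional version with an RBF kernel and Lebesgue measure on [0, 1]; the kernel, lengthscale, and node placement are assumptions of this sketch, not choices made in the paper.

```python
import math
import numpy as np

def bayesian_quadrature(f, n=12, ell=0.25, jitter=1e-8):
    """Posterior mean of the integral of f over [0, 1] under a GP prior."""
    x = np.linspace(0.0, 1.0, n)          # discretisation: evaluation nodes
    y = f(x)
    # RBF kernel Gram matrix, with a small jitter for numerical stability.
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell**2))
    K += jitter * np.eye(n)
    # Kernel mean embedding of Lebesgue measure on [0, 1]:
    # integral of exp(-(x - xi)^2 / (2 ell^2)) dx has a closed form via erf.
    z = ell * math.sqrt(math.pi / 2) * np.array(
        [math.erf((1 - xi) / (math.sqrt(2) * ell))
         + math.erf(xi / (math.sqrt(2) * ell)) for xi in x]
    )
    return float(z @ np.linalg.solve(K, y))

print(bayesian_quadrature(lambda x: x**2))  # close to the true value 1/3
```

The same conditioning also yields a posterior variance, which is what distinguishes the probabilistic viewpoint from a classical quadrature rule: the discretisation error is itself quantified.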
From Solving Equations to Deep Learning: A TensorFlow Python Tutorial
There have been some remarkable developments lately in the world of artificial intelligence, from much-publicized progress with self-driving cars to machines now composing Chopin imitations or just being really good at video games. Central to these advances are a number of tools available to help build deep learning and other machine learning models, with Torch, Caffe, and Theano among those at the fore. However, since Google Brain open-sourced its own framework, TensorFlow, in November 2015, its popularity has skyrocketed, making it the most widely used deep learning framework. Reasons include the wealth of support and documentation available, its production readiness, the ease of distributing calculations across a range of devices, and an excellent visualization tool: TensorBoard. Ultimately, TensorFlow manages to combine a comprehensive and flexible set of technical features with great ease of use.